On estimating the alphabet size of a discrete random source
Author
Abstract
We are concerned with estimating the alphabet size N from a stream of symbols drawn uniformly at random from that alphabet. We define and analyze a memory-restricted variant of an algorithm that was previously proposed for this purpose. The memory-restricted variant estimates the alphabet size N in $O(\sqrt{N})$ time and space.
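For intuition, the following is a minimal Python sketch of one way such an estimator can work, based on the birthday paradox: draw symbols until one repeats; since the expected number of draws to the first repeat is roughly $\sqrt{\pi N/2}$, the draw count yields an estimate of N while storing only about $O(\sqrt{N})$ symbols. This is an illustrative assumption, not necessarily the algorithm analyzed in the paper, and the names estimate_alphabet_size and max_memory are hypothetical.

import math
import random

def estimate_alphabet_size(stream, max_memory=None):
    # Birthday-paradox-style sketch (assumption, not the paper's exact method):
    # remember symbols seen so far and count draws until the first repeat.
    # E[draws to first repeat] ~ sqrt(pi * N / 2), so N ~ 2 * draws^2 / pi.
    seen = set()
    draws = 0
    for symbol in stream:
        draws += 1
        if symbol in seen:
            break
        seen.add(symbol)
        if max_memory is not None and len(seen) > max_memory:
            # Memory budget exhausted; a genuinely memory-restricted variant
            # would subsample or hash here instead of storing every symbol.
            break
    # Invert the expected-draws formula to get a point estimate of N.
    return round(2 * draws * draws / math.pi)

# Usage: estimate the size of a uniform alphabet of 10^6 symbols.
N = 10**6
stream = (random.randrange(N) for _ in range(10 * N))
print(estimate_alphabet_size(stream))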
Similar articles
Discrete Denoising for Channels with Memory
We consider the problem of estimating a discrete signal $X = (X_1, \ldots, X_n)$ based on its noise-corrupted observation signal $Z = (Z_1, \ldots, Z_n)$. The noise-free, noisy, and reconstruction signals are all assumed to have components taking values in the same finite $M$-ary alphabet $\{0, \ldots, M-1\}$. For concreteness we focus on the additive noise channel $Z_i = X_i + N_i$, where addition is modulo-$M$...
XRD-DFA Analysis of data on discrete units of micro-molecules: Applied on C52H60N4S8Zn2
The X-ray diffraction (XRD) technique, due to its strength in providing visualized atomic models of molecules, stands first in the field of characterization of newly synthesized molecules. Here, the analysis of diffraction data on discrete units of the new, not yet well-known structure of the molecule $\mathrm{C_{52}H_{60}N_4S_8Zn_2}$ is performed to obtain a quantitative view of the structure of the syst...
DISCRETE SIZE AND DISCRETE-CONTINUOUS CONFIGURATION OPTIMIZATION METHODS FOR TRUSS STRUCTURES USING THE HARMONY SEARCH ALGORITHM
Many methods have been developed for structural size and configuration optimization in which cross-sectional areas are usually assumed to be continuous. In most practical structural engineering design problems, however, the design variables are discrete. This paper proposes two efficient structural optimization methods based on the harmony search (HS) heuristic algorithm that treat both discret...
Analogy and duality between random channel coding and lossy source coding
Here we write in a unified fashion (using “R(P,Q,D)” [1]) the random coding exponents in channel coding and lossy source coding. We derive their explicit forms and show that, for a given random codebook distribution Q, the channel decoding error exponent can be viewed as an encoding success exponent in lossy source coding, and the channel correct-decoding exponent can be viewed as an encoding ...
Renyi Entropy Estimation Revisited
We revisit the problem of estimating the entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size n. For estimating Rényi entropy of order $\alpha$, up to constant accuracy and error probability, we show the following upper bounds: $n = O(1) \cdot 2^{(1-\frac{1}{\alpha})H_\alpha}$ for integer $\alpha$...
Journal: CoRR
Volume: abs/1711.07545
Pages: -
Publication date: 2017